
    Computing Preferred Answer Sets by Meta-Interpretation in Answer Set Programming

    Recently, Answer Set Programming (ASP) has been attracting interest as a new paradigm for problem solving. An important aspect that needs to be supported is the handling of preferences between rules, for which several approaches have been proposed. In this paper, we consider the problem of implementing preference handling approaches by means of meta-interpreters in Answer Set Programming. In particular, we consider the preferred answer set approaches by Brewka and Eiter; by Delgrande, Schaub and Tompits; and by Wang, Zhou and Lin. We present suitable meta-interpreters for these semantics using DLV, an efficient engine for ASP. Moreover, we present a meta-interpreter for the weakly preferred answer set approach by Brewka and Eiter, which uses the weak constraint feature of DLV to express and solve an underlying optimization problem. We also consider advanced meta-interpreters, which make use of graph-based characterizations and often allow for more efficient computations. Our approach shows the suitability of ASP in general, and of DLV in particular, for fast prototyping. This can be fruitfully exploited for experimenting with new languages and knowledge-representation formalisms.
    Comment: 34 pages, appeared as a Technical Report at KBS of the Vienna University of Technology, see http://www.kr.tuwien.ac.at/research/reports
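
    To give a flavour of the technique, here is a minimal sketch of the reification step on which such meta-interpreters rest (predicate names are illustrative, not necessarily the paper's exact encoding). An object program with the rules r1: a :- not b and r2: b :- not a, where r1 is preferred over r2, could be represented as facts:

        rule(r1).  head(r1,a).  nbl(r1,b).   % nbl: negative body literal
        rule(r2).  head(r2,b).  nbl(r2,a).
        prefer(r1,r2).

    A fixed set of meta-rules then reconstructs the answer sets of the object program from these facts and, guided by prefer/2, discards those that violate the preference order.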

    Aerojet - Attitude Control Engines

    All the engines were both qualification and acceptance tested at Marquardt's facilities. After we won the Apollo Program contract, we went off and built two vacuum test facilities, which allowed continuous firing at simulated altitude for as long as we wanted to run an engine. They would run days and days with the same capability we had on steam ejection. We did all of the testing in both for the qualification and the acceptance tests. One of them was a large ball, an eighteen-foot diameter sphere, evacuated again with a big steam ejector system that could be used for system testing; that's where we did the Lunar Excursion Module testing. We put the whole cluster in there and tested the entire cluster at the simulated altitude conditions. The lowest altitude we tested at - typically an acceptance test - was 105,000 feet simulated altitude.

    The big ball - because people were interested in what they called goop formation, which is an unburned hydrazine product migrating to cold surfaces on different parts of spacecraft - was built to address those kinds of issues. We ran long-life tests in a simulated space environment with the entire inside of the test cell around the test article liquid-nitrogen cooled, so it could act as a getter for any of the exhaust products. That particular facility could pull down to about 350,000 feet (atmosphere) equivalent altitude, which was pushing pretty close to the thermodynamic triple point of the MMH. It was a good test facility. Those facilities are no longer there. When the guys at Marquardt sold the company to what eventually became part of Aerojet, all those test facilities were cut off at the roots. I think they have a movie studio there at this point. That part of it is truly not recoverable, but it did some excellent high-altitude, space-equivalent testing at the time.

    Surprisingly, we had very few problems while testing in the San Fernando Valley. In the early 1960s, nobody had ever seen dinitrogen tetroxide (N2O4), so that wasn't too big a deal. We really did only make small, red clouds. In all the hundreds of thousands of tests and probably well over one million firings that I was around that place for, in all those thirty-something years, we had a total of one serious injury associated with rocket engine testing and propellants. Because we were trying to figure out what propellants would really be good, we tried all of the fun stuff like the carbon tetrafluoride, chlorine pentafluoride, and pure fluorine. The materials knowledge wasn't all that great at the time. On one test, the fluorine we had didn't react well with the copper they were using for tubing, and it managed to cause another unscheduled disassembly of the facility. It was very serious. It's like one of those Korean War stories. The technician happened to be walking past the test facility when it decided to blow itself up. A piece of copper tubing pierced one cheek and came out the other. That was the only serious accident in all of the engines handled in all those years.

    Now, we did have a problem with the EPA later, because they figured out what the brown clouds were about. We built a whole bunch of exhaust mitigation scrubbers to take care of engine testing in the daytime. In general, we operated the big shuttle (RCS) engine, the 870-pounder, at nominal conditions; they scrubbed the effluents pretty well. If you operated that same 870-pound force engine at a level where you get a lot of excess oxidizer, yeah, there's a brown cloud. But, you know, it doesn't show up well in the dark. They did do some of that. But that's gone; it was addressed one way or another.

    A Logic Programming Approach to Knowledge-State Planning: Semantics and Complexity

    We propose a new declarative planning language, called K, which is based on principles and methods of logic programming. In this language, transitions between states of knowledge can be described, rather than transitions between completely described states of the world, which makes the language well-suited for planning under incomplete knowledge. Furthermore, it enables the use of default principles in the planning process by supporting negation as failure. Nonetheless, K also supports the representation of transitions between states of the world (i.e., states of complete knowledge) as a special case, which shows that the language is very flexible. As we demonstrate on particular examples, the use of knowledge states may allow for a natural and compact problem representation. We then provide a thorough analysis of the computational complexity of K, and consider different planning problems, including standard planning and secure planning (also known as conformant planning) problems. We show that these problems have different complexities under various restrictions, ranging from NP to NEXPTIME in the propositional case. Our results form the theoretical basis for the DLV^K system, which implements the language K on top of the DLV logic programming system.
    Comment: 48 pages, appeared as a Technical Report at KBS of the Vienna University of Technology, see http://www.kr.tuwien.ac.at/research/reports
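
    As a hedged illustration of planning under incomplete knowledge in the style of K (keywords approximated from the DLV^K language; not a verbatim program from the paper): a door is initially either locked or unlocked, and the agent must get it open.

        fluents   : open. locked.
        actions   : unlock. push.
        always    : executable unlock. executable push.
                    caused -locked after unlock.
                    caused open after push, -locked.
        initially : total locked.
        goal      : open ? (2)

    Because total locked. leaves the initial truth of locked unknown, a secure (conformant) plan must succeed in every initial state; unlock followed by push is such a plan of length 2.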

    Single-Beat Noninvasive Imaging of Ventricular Endocardial and Epicardial Activation in Patients Undergoing CRT

    BACKGROUND: Little is known about the effect of cardiac resynchronization therapy (CRT) on endo- and epicardial ventricular activation. Noninvasive imaging of cardiac electrophysiology (NICE) is a novel imaging tool for visualization of both epi- and endocardial ventricular electrical activation. METHODOLOGY/PRINCIPAL FINDINGS: NICE was performed in ten patients with congestive heart failure (CHF) undergoing CRT and in ten patients without structural heart disease (control group). NICE fuses data from high-resolution ECG mapping with a model of the patient's individual cardiothoracic anatomy created from magnetic resonance imaging. Beat-to-beat endocardial and epicardial ventricular activation sequences were computed during native rhythm as well as during ventricular pacing, using a bidomain theory-based heart model to solve the related inverse problem. During right ventricular (RV) pacing, control patients showed a deterioration of the ventricular activation sequence similar to the intrinsic activation pattern of CHF patients. Left ventricular propagation velocities were significantly decreased in CHF patients as compared to the control group (1.6±0.4 versus 2.1±0.5 m/sec; p<0.05). CHF patients showed right-to-left septal activation with the latest activation epicardially in the lateral wall of the left ventricle. Biventricular pacing resulted in a resynchronization of the ventricular activation sequence and in a marked decrease of total LV activation duration as compared to intrinsic conduction and RV pacing (129±16 versus 157±28 and 173±25 ms; both p<0.05). CONCLUSIONS/SIGNIFICANCE: Endocardial and epicardial ventricular activation can be visualized noninvasively by NICE. Assessment of individual ventricular activation properties may help identify responders to CRT and further improve the response to CRT by facilitating patient-specific lead placement and device programming.
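
    For orientation on the forward model that such an inverse computation relies on (a standard static bidomain relation; the study's exact formulation may differ): the extracellular potential \phi_e, which is ultimately projected to the body surface, is driven by the transmembrane voltage v_m via

        \nabla \cdot \left( (\sigma_i + \sigma_e) \nabla \phi_e \right) = - \nabla \cdot \left( \sigma_i \nabla v_m \right),

    where \sigma_i and \sigma_e are the intra- and extracellular conductivity tensors. Methods of this kind invert the resulting surface ECG data to recover the activation sequence on the heart.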

    The DLV System for Knowledge Representation and Reasoning

    This paper presents the DLV system, which is widely considered the state-of-the-art implementation of disjunctive logic programming, and addresses several aspects. As for problem solving, we provide a formal definition of its kernel language, function-free disjunctive logic programs (also known as disjunctive datalog), extended by weak constraints, which are a powerful tool to express optimization problems. We then illustrate the usage of DLV as a tool for knowledge representation and reasoning, describing a new declarative programming methodology which allows one to encode complex problems (up to \Delta^P_3-complete problems) in a declarative fashion. On the foundational side, we provide a detailed analysis of the computational complexity of the language of DLV, and by deriving new complexity results we chart a complete picture of the complexity of this language and important fragments thereof. Furthermore, we illustrate the general architecture of the DLV system, which has been influenced by these results. As for applications, we overview application front-ends which have been developed on top of DLV to solve specific knowledge representation tasks, and we briefly describe the main international projects investigating the potential of the system for industrial exploitation. Finally, we report about thorough experimentation and benchmarking, which has been carried out to assess the efficiency of the system. The experimental results confirm the solidity of DLV and highlight its potential for emerging application areas like knowledge management and information integration.
    Comment: 56 pages, 9 figures, 6 tables
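
    A small, self-contained example of the kernel language (standard DLV syntax; the graph facts are made up for illustration): 3-colouring a graph while preferring answer sets that use the colour red as little as possible.

        node(1). node(2). node(3). edge(1,2). edge(2,3).
        % guess a colour for each node (disjunctive rule)
        col(X,red) v col(X,green) v col(X,blue) :- node(X).
        % strong constraint: adjacent nodes must get different colours
        :- edge(X,Y), col(X,C), col(Y,C).
        % weak constraint: penalise red nodes (weight 1, level 1)
        :~ col(X,red). [1:1]

    The disjunctive rule guesses a colouring, the strong constraint eliminates improper ones, and the weak constraint turns the remaining choice into the kind of optimization problem mentioned above.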

    Agroecological practices in combination with healthy diets can help meet EU food system policy targets

    Agroecology has been proposed as a strategy to improve food system sustainability, but has also been criticised for using land inefficiently. We compared five explorative storylines, developed in a stakeholder process, for future food systems in the EU to 2050. We modelled a range of biophysical (e.g., land use and food production), environmental (e.g., greenhouse gas emissions) and social indicators, and potential for regional food self-sufficiency, and investigated the economic policy needed to reach these futures by 2050. Two contrasting storylines for upscaling agroecological practices emerged. In one, agroecology was implemented to produce high-value products serving high-income consumers through trade but, despite 40% of the agricultural area being under organic management, only two of the eight EU environmental policy targets were met. As diets followed current trends in this storyline, there were few improvements in environmental indicators compared with the current situation, despite large-scale implementation of agroecological farming practices. This suggests that large-scale implementation of agroecological practices without concurrent changes on the demand side could aggravate existing environmental pressures. However, our second agroecological storyline showed that if large-scale diffusion of agroecological farming practices were implemented alongside drastic dietary change and waste reductions, major improvements on environmental indicators could be achieved and all relevant EU policy targets met. An alternative storyline comprising sustainable intensification in combination with dietary change and waste reductions was efficient in meeting targets related to climate, biodiversity, ammonia emissions, and use of antibiotics, but did not meet targets for reductions in pesticide and fertiliser use. These results confirm the importance of dietary change for food system climate change mitigation. Economic modelling showed a need for drastic changes in consumer preferences towards more plant-based, agroecological and local foods, and for improvements in technology, for these storylines to be realised, as very high taxes and tariffs would otherwise be needed.

    Influence of water uptake on the aerosol particle light scattering coefficients of the Central European aerosol

    The influence of aerosol water uptake on aerosol particle light scattering was examined at the regional continental research site Melpitz, Germany. The scattering enhancement factor f(RH), defined as the aerosol particle scattering coefficient at a certain relative humidity (RH) divided by its dry value, was measured using a humidified nephelometer. The chemical composition and other microphysical properties were measured in parallel. f(RH) showed a strong variation, e.g. with values between 1.2 and 3.6 at RH=85% and λ=550 nm. The chemical composition was found to be the main factor determining the magnitude of f(RH), since it clearly correlated with the inorganic mass fraction measured by an aerosol mass spectrometer (AMS). Hysteresis within the recorded humidograms was observed and explained by long-range transported sea salt. A closure study using Mie theory showed the consistency of the measured parameters.
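
    Written out, the enhancement factor defined above is the ratio

        f(RH, \lambda) = \frac{\sigma_{sp}(RH, \lambda)}{\sigma_{sp}(dry, \lambda)},

    so a value of f(RH) = 3.6 at RH = 85% and \lambda = 550 nm means the hydrated particles scatter 3.6 times more light than in the dry reference state.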